L1-norm penalised orthogonal forward regression

Authors

  • Xia Hong
  • Sheng Chen
  • Yi Guo
  • Junbin Gao
Abstract

Xia Hong, Sheng Chen, Yi Guo and Junbin Gao. Department of Computer Science, School of Mathematical, Physical and Computational Sciences, University of Reading, Reading, UK; Electronics and Computer Science, University of Southampton, Southampton, UK; Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah, Saudi Arabia; CSIRO Mathematics and Information Sciences, North Ryde, Australia; Discipline of Business Analytics, University of Sydney Business School, University of Sydney, Camperdown, Australia


Similar articles

L1-norm Penalised Orthogonal Forward Regression

An l1-norm penalized orthogonal forward regression (l1-POFR) algorithm is proposed based on the concept of the leave-one-out mean square error (LOOMSE). Firstly, a new l1-norm penalized cost function is defined in the constructed orthogonal space, and each orthogonal basis is associated with an individually tunable regularization parameter. Secondly, due to orthogonal computation, the LOOMSE can be anal...
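The key computational trick behind LOOMSE is that, for a regularised least-squares model, the i-th leave-one-out residual equals the ordinary residual divided by (1 - h_ii), where h_ii is the hat-matrix diagonal, so no model is ever refitted. A minimal NumPy sketch on synthetic data, using a single shared regularisation parameter rather than the per-basis parameters the paper tunes:

```python
import numpy as np

# Synthetic data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=50)

lam = 0.1  # one shared regularisation parameter (the paper tunes one per basis)

# Regularised least squares: w = (X'X + lam*I)^{-1} X'y
A = X.T @ X + lam * np.eye(X.shape[1])
A_inv = np.linalg.inv(A)
w = A_inv @ X.T @ y
residual = y - X @ w

# Hat-matrix diagonal h_ii = x_i' A^{-1} x_i; the exact leave-one-out
# residual is residual_i / (1 - h_ii), with no refitting required.
h = np.einsum('ij,jk,ik->i', X, A_inv, X)
loomse = np.mean((residual / (1.0 - h)) ** 2)
print(loomse)
```

The identity is exact for regularised least squares, which is why a brute-force leave-one-out loop reproduces the same number.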


Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality - IEE Proceedings: Control Theory and Applications

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises model approximation ability, sparsity and robustness. The model parameters derived in each forward regression step are initially estimated via orthogonal least squares (OLS), and then tuned with a new gradient-descent learning algorithm ba...


An Orthogonal Forward Regression Algorithm Combined with Basis Pursuit and D-optimality

A new forward regression model identification algorithm is introduced. The model parameters derived in each forward regression step are initially estimated via orthogonal least squares (OLS) (using the modified Gram-Schmidt procedure), and then tuned with a new gradient-descent learning algorithm based on the basis pursuit that minimizes the norm of the parameter estimate vector. The...
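Both OLS-based entries above share the same core step: orthogonalise the remaining candidate regressors against the already-selected ones with modified Gram-Schmidt, then pick the candidate with the largest error-reduction ratio (ERR). A small illustrative sketch on synthetic data (the variable names are mine, not from the papers):

```python
import numpy as np

# Synthetic candidate regressors: y depends only on columns 2 and 4.
rng = np.random.default_rng(1)
P = rng.normal(size=(100, 6))
y = 2.0 * P[:, 2] - 1.5 * P[:, 4] + 0.05 * rng.normal(size=100)

remaining = list(range(P.shape[1]))
selected = []
candidates = P.copy()
r = y.copy()
for _ in range(2):  # select two terms for this sketch
    cols = candidates[:, remaining]
    # Error-reduction ratio of each remaining (orthogonalised) candidate.
    err = (cols.T @ r) ** 2 / (np.sum(cols ** 2, axis=0) * (r @ r))
    k = remaining.pop(int(np.argmax(err)))
    q = candidates[:, k] / np.linalg.norm(candidates[:, k])
    selected.append(k)
    # Modified Gram-Schmidt: deflate the other candidates and the residual.
    candidates = candidates - np.outer(q, q @ candidates)
    r = r - q * (q @ r)

print(selected)
```

Because each selected direction is removed from every remaining candidate, the ERR scores at later steps measure only the *new* variance a candidate can explain, which is what makes the greedy selection well behaved.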


A Note on the Lasso for Gaussian Graphical Model Selection

Inspired by the success of the Lasso for regression analysis (Tibshirani, 1996), it seems attractive to estimate the graph of a multivariate normal distribution by l1-norm penalised likelihood maximisation. The objective function is convex and the graph estimator can thus be computed efficiently, even for very large graphs. However, we show in this note that the resulting estimator is not consi...
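For the regression Lasso that motivates this note (Tibshirani, 1996), the standard solver is cyclic coordinate descent with soft thresholding. A self-contained sketch of that solver (not the graphical-model estimator the note itself analyses):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for (1/2n)||y - Xw||^2 + lam*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = np.sum(X ** 2, axis=0) / n
    r = y.copy()  # residual y - Xw, maintained incrementally
    for _ in range(n_sweeps):
        for j in range(p):
            r = r + X[:, j] * w[j]  # remove coordinate j's contribution
            w[j] = soft_threshold(X[:, j] @ r / n, lam) / col_sq[j]
            r = r - X[:, j] * w[j]
    return w
```

With a sparse ground truth, the inactive coefficients are driven exactly to zero while the active ones are shrunk slightly toward zero, which is the variable-selection behaviour the note contrasts with graph selection.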


A semi-automatic method to guide the choice of ridge parameter in ridge regression

We consider the application of a popular penalised regression method, Ridge Regression, to data with very high dimensions and many more covariates than observations. Our motivation is the problem of out-of-sample prediction and the setting is high-density genotype data from a genome-wide association or resequencing study. Ridge regression has previously been shown to offer improved performance ...




Publication date: 2017